Neural Triangular Transport Maps: A New Approach Towards Sampling in Lattice QCD

Bryutkin, Andrey, Marzouk, Youssef

arXiv.org Machine Learning

Lattice field theories are fundamental testbeds for computational physics; yet, sampling their Boltzmann distributions remains challenging due to multimodality and long-range correlations. While normalizing flows offer a promising alternative, their application to large lattices is often constrained by prohibitive memory requirements and the challenge of maintaining sufficient model expressivity. We propose sparse triangular transport maps that explicitly exploit the conditional independence structure of the lattice graph under periodic boundary conditions using monotone rectified neural networks (MRNNs). We introduce a comprehensive framework for triangular transport maps that navigates the fundamental trade-off between \emph{exact sparsity} (respecting marginal conditional independence in the target distribution) and \emph{approximate sparsity} (computational tractability without fill-ins). Restricting each triangular map component to a local past enables site-wise parallel evaluation and linear time complexity in lattice size $N$, while preserving the expressive, invertible structure. Using $\phi^4$ theory in two dimensions as a controlled setting, we analyze how node labelings (orderings) affect the sparsity and performance of triangular maps. We compare against Hybrid Monte Carlo (HMC) and established flow approaches (RealNVP).
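The core structural idea — each triangular map component conditions only on a local past, so the map stays invertible and evaluates in linear time — can be illustrated with a minimal sketch. The linear-in-$z$ components below are a hypothetical stand-in for the paper's monotone rectified neural networks; the window size `K`, parameters `a`, `W`, and the 1-D chain are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 16          # number of lattice sites (1-D chain for illustration)
K = 2           # size of the "local past" window for each component

# Hypothetical parameters of a simple monotone triangular map:
# T_i(z) = softplus(a_i) * z_i + w_i . x_{i-K:i}  (shift from local past only)
a = rng.normal(size=N)                # unconstrained; softplus makes the slope > 0
W = rng.normal(size=(N, K)) * 0.1     # coupling to the K preceding sites

def softplus(x):
    return np.log1p(np.exp(x))

def forward(z):
    """Map a reference sample z -> x, component by component (lower triangular)."""
    x = np.empty(N)
    for i in range(N):
        past = x[max(0, i - K):i]                     # only the local past
        shift = W[i, :len(past)] @ past if i > 0 else 0.0
        x[i] = softplus(a[i]) * z[i] + shift          # strictly monotone in z[i]
    return x

def inverse(x):
    """Invert by forward substitution; each step is a 1-D monotone solve."""
    z = np.empty(N)
    for i in range(N):
        past = x[max(0, i - K):i]
        shift = W[i, :len(past)] @ past if i > 0 else 0.0
        z[i] = (x[i] - shift) / softplus(a[i])
    return z

z = rng.normal(size=N)
x = forward(z)
assert np.allclose(inverse(x), z)   # exact invertibility by construction
```

Because each component touches at most `K` previously computed sites, both directions cost O(NK) rather than O(N^2), which is the source of the linear scaling the abstract describes.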





An invertible generative model for forward and inverse problems

van Leeuwen, Tristan, Brune, Christoph, Carioni, Marcello

arXiv.org Machine Learning

We formulate the inverse problem in a Bayesian framework and aim to train a generative model that allows us to simulate (i.e., sample from the likelihood) and do inference (i.e., sample from the posterior). We review the use of triangular normalizing flows for conditional sampling in this context and show how to combine two such triangular maps (an upper and a lower one) into one invertible mapping that can be used for both simulation and inference. We work out several useful properties of this invertible generative model and propose a training loss for learning the map directly. We illustrate this new approach to conditional generative modeling numerically on a few stylized examples.
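The division of labor between the two triangular blocks can be sketched on a toy linear-Gaussian model, where both conditionals are available in closed form. The model, the gain `k`, and the function names here are illustrative assumptions, not the paper's construction; they only show how a "lower" block samples the likelihood while an "upper" block samples the posterior.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear-Gaussian model: x ~ N(0, 1), y | x ~ N(x, sigma^2)
sigma = 0.5

# "Lower" block S(x, z) = x + sigma * z samples the likelihood y | x (simulation).
def simulate(x, z):
    return x + sigma * z

# "Upper" block acts on y first: for this model the posterior is
# x | y ~ N(k * y, 1 - k) with gain k = 1 / (1 + sigma^2),
# so U(y, z') = k * y + sqrt(1 - k) * z' samples the posterior (inference).
k = 1.0 / (1.0 + sigma**2)

def infer(y, z):
    return k * y + np.sqrt(1.0 - k) * z

# Composing the two blocks gives one invertible map on (x, y):
# the lower block simulates, the upper block answers the inverse question.
x = rng.normal(size=100_000)
y = simulate(x, rng.normal(size=x.size))
x_post = infer(y, rng.normal(size=y.size))

# Sanity check: posterior samples given y should correlate with the true x
# (the theoretical correlation here is k = 0.8).
print(np.corrcoef(x, x_post)[0, 1])
```

In the paper's setting the closed-form Gaussian conditionals are replaced by learned triangular maps, but the simulate/infer roles of the lower and upper blocks are the same.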



A friendly introduction to triangular transport

Ramgraber, Maximilian, Sharp, Daniel, Provost, Mathieu Le, Marzouk, Youssef

arXiv.org Machine Learning

Decision making under uncertainty is a cross-cutting challenge in science and engineering. Most approaches to this challenge employ probabilistic representations of uncertainty. In complicated systems accessible only via data or black-box models, however, these representations are rarely known. We discuss how to characterize and manipulate such representations using triangular transport maps, which approximate any complex probability distribution as a transformation of a simple, well-understood distribution. The particular structure of triangular transport guarantees many desirable mathematical and computational properties that translate well into solving practical problems. Triangular maps are actively used for density estimation, (conditional) generative modelling, Bayesian inference, data assimilation, optimal experimental design, and related tasks. While there is ample literature on the development and theory of triangular transport methods, this manuscript provides a detailed introduction for scientists interested in employing measure transport without assuming a formal mathematical background. We build intuition for the key foundations of triangular transport, discuss many aspects of its practical implementation, and outline the frontiers of this field.
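The "transformation of a simple, well-understood distribution" in this introduction is the Knothe–Rosenblatt rearrangement, which for a correlated 2-D Gaussian target can be written down exactly. The following is a minimal sketch of that closed-form case; the correlation value and function names are chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Target: zero-mean 2-D Gaussian with unit variances and correlation rho.
rho = 0.8

# The Knothe-Rosenblatt map from the standard normal reference is triangular:
#   x1 = T1(z1)     = z1                                  (marginal of x1)
#   x2 = T2(x1, z2) = rho * x1 + sqrt(1 - rho^2) * z2     (conditional x2 | x1)
def kr_forward(z1, z2):
    x1 = z1
    x2 = rho * x1 + np.sqrt(1.0 - rho**2) * z2
    return x1, x2

z = rng.normal(size=(2, 200_000))
x1, x2 = kr_forward(z[0], z[1])

# Pushforward samples should exhibit the target correlation.
print(np.corrcoef(x1, x2)[0, 1])   # close to 0.8
```

Each component is monotone in its last argument, so the map is invertible one coordinate at a time — the property that makes triangular transport convenient for the density estimation and conditional sampling tasks the abstract lists.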


Review for NeurIPS paper: Fast and Flexible Temporal Point Processes with Triangular Maps

Neural Information Processing Systems

Summary and Contributions: This work first proposes a new parametrization for several classic temporal point processes (TPPs), which enables efficient parallel likelihood computation and sampling. TPPs naturally handle data consisting of a variable number of events in continuous time, but these classic TPP models were inherently sequential under existing parametrizations. Next, the authors propose a new class of non-recurrent TPP models, namely TriTPP, in which both sampling and likelihood computation can be done in parallel. TPP models combined with recurrent neural networks provide a highly flexible and powerful framework, but they remain sequential, making such TPPs poorly suited for fast sampling.
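The parallelism the review highlights comes from viewing TPP sampling as a transport map applied elementwise. This is not the paper's TriTPP construction, but a minimal sketch of the underlying idea for an inhomogeneous Poisson process: the inverse compensator maps unit-rate arrival times to the target process with no sequential dependence in the transform itself. The intensity `lam(t) = 2t` and the block size `n_max` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Inhomogeneous Poisson process on [0, T] with intensity lam(t) = 2t,
# so the compensator is Lambda(t) = t^2 and its inverse is the square root.
T = 5.0
n_max = 64   # draw a fixed block of candidate inter-arrival times

# Unit-rate Poisson arrival times: cumulative sums of Exp(1) gaps.
gaps = rng.exponential(size=n_max)
unit_arrivals = np.cumsum(gaps)

# The inverse compensator is applied elementwise -- after the cumulative
# sum there is no sequential dependence, so the transform parallelizes
# over events (the key contrast with recurrent TPP samplers).
times = np.sqrt(unit_arrivals)
times = times[times < T]
print(len(times), "events; expected about", T**2)   # E[N] = Lambda(T) = 25
```

TriTPP generalizes this picture by composing learnable triangular maps rather than a fixed closed-form compensator, which is what buys flexibility while keeping sampling parallel.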